
Computing as an Appliance: Apple's Contribution

or, How Apple Ruined Computing for All Time

As readers know by now, I have been calling the PC industry an appliance industry since around 2007. I've tried to justify this position in several ways over the years, with mixed results. It should come as no surprise that the following events all occurred at roughly the same time:

As a result, Microsoft needed a platform that it tightly controlled:

Given those two principles, Microsoft controlled every mechanism by which a casual user could put software on the device, thus guaranteeing itself a revenue stream.

While I can't fault Microsoft for wanting to ensure continuing revenue, I can certainly fault them for a lack of imagination in addressing customers' needs. Many customers neither wanted nor asked for an appliance form factor; they wanted enhanced core technology instead. That came eventually (preemptive multitasking arrived in Windows more than ten years after Commodore's Amiga reached the market, and useful memory protection didn't arrive in Windows until the year 2000!), but there's no denying that the big selling point for Windows and Office in particular, and for many third-party products designed to be compatible with them, was visual glitz and how intuitive the user interface was intended to be (but rarely was).

To this day, products are often sold with no meaningful change in utility, only different user interface layouts. When was the last time you upgraded Office and realized you could actually use some new feature in Word to your benefit? Every document I write today could still be written in Word for Windows 3.1. Once you have the perfect (or at least a good enough) product, how else do you compete? Witness automotive manufacturers.

I'm happy to see others coming to similar conclusions regarding computing-as-an-appliance, and doing so with very different data sets. About a week ago, I came across a toot-stream (sources at the end of this article) that provides a cogent and poignant chronicle of what happened to the industry. With the author's permission, I reproduce the essay below. Please note: I reproduce it as it was typed, and have made no effort to edit the material; spelling, capitalization, and punctuation errors are left as-is. Mastodon, the social network I drew the content from, imposes per-post length limits like Twitter's (just larger), and much of the content's unconventional structure stems from those limits.

Mona Drafter writes,

Apple ruined computing for all time

before talking about how Apple inflicted a mortal wound on personal computing, I want to talk about something more mundane and familiar: the consumer good. the appliance.

take, for example, the electric stove. I have used probably at least a dozen of these in my life, all provided with the house or apartment, and they have all been the same.

you get some resistive heating coils (which usually don't even support a pot levelly, and which make contact with the pot at only two or three points) controlled by an "open-loop" circuit, i.e. with no feedback for regulation of temperature, just the war between resistive heating and heat loss through radiation and conduction, which varies widely upon conditions, so you can never be assured of even, reproducible heating. it's rubbish.

I do not know exactly how old this design for an electric stove is, but it dates back approximately a century, perhaps more. and it is almost the only model of electric stove available to the general public.

purchasing more expensive models of the electric stove gets you some extra polish. you might get a layer of glass over the heating coils. the panel controls are slicker. overall build quality might be better

but the same stove.

this is the general pattern with consumer goods under capitalism. to get any real difference you need to look outside the consumer market, to far more expensive devices sold to professional cooks. (I'm aware of inductive cooktops but I do not consider these a replacement, for they are not for general purpose heating of cookware.)

what Apple tried to do, what has doomed us today, is to drag personal computers into this dead zone.

I am old enough to remember the previous generation of personal computers, before Apple's Macintosh set the tone for the rest of the century and beyond--machines like the Commodore-64 and the older-generation Apples: they were both instantly usable and instantly programmable.

there was nothing you couldn't do on your C-64 right out of the box. it came with complete technical specifications. AND you could just run apps on it.

Apple, quite deliberately, set out to wreck this. they wished to make a personal computer that was like a "white good", like an electric stove, frozen in place. out of the box, the Macintosh was a completely inflexible machine.

programming the Macintosh originally required the "Macintosh Programmer's Workshop", which I attempted to use once (I have done some System 7 programming though it was a long time ago.)

MPW was a nightmare.

MPW was a hideously expensive, strange software development environment that was a sort of own-brand command-line environment without the advantage of UNIX conciseness, and what seemed like almost wilfully obtuse syntax. (the "curly-d" partial differential symbol was an important token for some reason.)

eventually third-party developers would supply better and more affordable environments but the initial message was clear

by making the only way to program the Macintosh a byzantine horror, and charging hundreds of dollars for it, Apple was making it quite clear that whatever ideas they were purporting to extol about ease of use in personal computing, ideas mostly stolen from Xerox, they did NOT want those ideas to apply to programming. they wanted that to be difficult and arcane

they wanted to freeze computing in place, and dictate its course

it did not have to be this way. ironically, Apple gave us at least an indication of how it could have been different, with HyperCard

here was a development environment that was actually consistent with the Macintosh user experience and not some bizarre command-line monstrosity running parallel to it

but it was rather a toy, albeit a flexible one, and like all of Apple's really good ideas, they abandoned it

Apple's fatal sundering of usability and programmability has informed the entire personal computing world since then, corrupting and ruining it

they froze interface development. everything since has just been copies of copies of copies of that original Xerox work

more importantly, everyone copied how Apple sold and marketed their devices. everyone wanted to make personal computers into mere appliances after Apple.

the Apple philosophy infected the smart phone, and as a consequence smart phones are some of the worst consumer devices imaginable, inflexible and infuriating, loaded with unnecessary software that can't be removed, programmable only with great difficulty. the Android SDK is a deliberate disaster. it's not meant to be usable by the ordinary human.

and casting a pall over the entire industry is a lesson ground into us from childhood now, reinforced by a toxic computer-geek community: This Is The Way It Must Be

nobody can imagine personal computers working any differently now

we're just supposed to live with this wretched state of affairs that Apple and Steve Jobs helped to create, decades ago.

well...I'm dedicated to putting an end to it.

this is not my field. I am a physical scientist, a chemist. I did not want to spend the waning years of my life and my intellect fighting with computers, but I think it's necessary. more than ever

we need to wrest these powerful and wonderful machines away from their current masters and give them back to the people, all people, not just a priesthood of computer geeks

I'd like to thank Mona Drafter for letting me collect their essay and reproduce it in full on my website.

Sources for the individual messages: